# T5 architecture optimization
## Chronos Bolt Small
*amazon · Apache-2.0 · 340.81k downloads · 10 likes · Climate Model*

Chronos-Bolt is a series of pretrained time series forecasting models based on the T5 encoder-decoder architecture, trained on nearly 100 billion time series observations and supporting zero-shot prediction.
## T5 3b Q4 K M GGUF
*VVS2024 · Apache-2.0 · 15 downloads · 0 likes · Machine Translation · Supports Multiple Languages*

A quantized version of google-t5/t5-3b, converted to GGUF format with llama.cpp via ggml.ai's GGUF-my-repo space.
## Kazrush Kk Ru
*deepvk · Apache-2.0 · 2,630 downloads · 8 likes · Machine Translation · Transformers · Other*

kazRush-kk-ru is a Kazakh-to-Russian translation model based on the T5 configuration, trained on multiple parallel datasets.
## Biot5 Plus Large
*QizhiPei · MIT · 52 downloads · 2 likes · Large Language Model · Transformers · English*

BioT5+ is a large biomedical language model based on the T5 architecture; it integrates IUPAC chemical nomenclature and uses multi-task fine-tuning to achieve generalized biological understanding.
## Ptt5 V2 Base
*unicamp-dl · Apache-2.0 · 1,197 downloads · 2 likes · Large Language Model · Transformers · Other*

ptt5-v2 is a series of Portuguese T5 models obtained by continued pre-training from Google's original checkpoints.
## Ptt5 V2 Small
*unicamp-dl · Apache-2.0 · 85 downloads · 1 like · Large Language Model · Transformers · Other*

A pre-trained T5 model optimized for Portuguese, obtained by continued training from Google's original t5-small checkpoint.
## FRED T5 Summarizer
*RussianNLP · MIT · 11.76k downloads · 21 likes · Text Generation · Transformers · Other*

A Russian text summarization model developed by SberDevices, based on the T5 architecture with 1.7B parameters.
## T5 Translate En Ru Zh Small 1024
*utrobinmv · Apache-2.0 · 2,405 downloads · 34 likes · Machine Translation · Transformers · Supports Multiple Languages*

A multilingual translation model based on the T5 architecture, supporting bidirectional translation among Russian, Chinese, and English.
## Madlad400 7b Mt
*google · Apache-2.0 · 4,450 downloads · 15 likes · Machine Translation · Supports Multiple Languages*

A multilingual machine translation model based on the T5 architecture, supporting 400+ languages and trained on 250 billion tokens.
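Multilingual T5-style translation models of this kind typically select the output language with a target-language token prepended to the source text; MADLAD-400's published examples use tokens of the form `<2xx>` (e.g. `<2pt>` for Portuguese). A minimal sketch of that input formatting, with the helper name being illustrative rather than part of any official API:

```python
def format_madlad_input(text: str, target_lang: str) -> str:
    """Prepend the <2xx> target-language token that tells the model
    which language to translate the text into."""
    return f"<2{target_lang}> {text}"

# The formatted string is what gets tokenized and fed to the model.
print(format_madlad_input("I love pizza!", "pt"))  # <2pt> I love pizza!
print(format_madlad_input("I love pizza!", "de"))  # <2de> I love pizza!
```

Because the target language is a plain token in the input, a single checkpoint can serve all 400+ language directions without per-pair fine-tuning.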
## Medical Mt5 Large
*HiTZ · Apache-2.0 · 3,019 downloads · 21 likes · Large Language Model · Transformers · Supports Multiple Languages*

Medical mT5 is the first open-source multilingual text generation model for the medical domain, based on the mT5 checkpoint, supporting English, Spanish, French, and Italian.
## Flan T5 Base Squad2
*sjrhuschlee · MIT · 2,425 downloads · 4 likes · Question Answering System · Transformers · English*

An extractive QA model fine-tuned on the SQuAD2.0 dataset based on flan-t5-base, capable of handling question-answer pairs including unanswerable questions.
## Long T5 Tglobal Base Sci Simplify Elife
*pszemraj · Apache-2.0 · 22.98k downloads · 5 likes · Text Generation · Transformers · English*

A model based on the Long-T5 architecture, specifically designed for generating popular summaries of scientific papers, transforming complex research content into text understandable by non-specialists.
## Chatgpt Paraphraser On T5 Base
*humarin · OpenRAIL · 115.08k downloads · 185 likes · Text Generation · Transformers · English*

A text paraphrasing model trained on the T5-base architecture, capable of generating high-quality paraphrased text; its authors describe it as one of the best paraphrasing models available on Hugging Face.
## FRED T5 Large
*ai-forever · Apache-2.0 · 998 downloads · 25 likes · Large Language Model · Transformers · Other*

A Russian pre-trained language model based on the T5 architecture, employing a mixed training strategy with 7 denoisers similar to UL2, supporting various text generation tasks.
## FRED T5 1.7B
*ai-forever · Apache-2.0 · 1,671 downloads · 77 likes · Large Language Model · Transformers · Other*

A Russian pre-trained language model with 1.7 billion parameters, based on the T5 architecture and employing a UL2-like mixed training strategy with 7 denoising tasks.
## Solidity T5
*hululuzhu · Apache-2.0 · 141 downloads · 11 likes · Text Generation · Transformers · English*

A T5-based Solidity smart contract code generation model designed for Web3 development.
## Finabsa
*amphora · Apache-2.0 · 146 downloads · 5 likes · Large Language Model · Transformers · English*

FinABSA is a model trained on the T5-Large architecture for Aspect-Based Sentiment Analysis (ABSA) tasks. It can parse sentences containing multiple aspects. By replacing the target aspect with the [TGT] token, the model can focus on sentiment prediction for that specific aspect.
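The [TGT] masking that FinABSA relies on can be sketched as a small preprocessing step; the helper below is illustrative, not part of the model's released code:

```python
def mask_aspect(sentence: str, aspect: str, token: str = "[TGT]") -> str:
    """Replace mentions of the target aspect with the [TGT] token so the
    model's sentiment prediction focuses on that aspect alone."""
    return sentence.replace(aspect, token)

# A sentence with two aspects yields one masked input per aspect:
text = "AAPL rallied while MSFT slipped after the earnings report."
print(mask_aspect(text, "AAPL"))  # [TGT] rallied while MSFT slipped after the earnings report.
print(mask_aspect(text, "MSFT"))  # AAPL rallied while [TGT] slipped after the earnings report.
```

One masked input is produced per aspect, so a single sentence mentioning n tickers yields n separate sentiment predictions.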
## Randeng T5 784M QA Chinese
*IDEA-CCNL · 166 downloads · 32 likes · Question Answering System · Transformers · Chinese*

The first Chinese generative question-answering pre-trained T5 model, pre-trained on the WuDao 180G corpus and fine-tuned on the Chinese SQuAD and CMRC2018 datasets.
## Sentence It5 Base
*efederici · 86 downloads · 4 likes · Text Embedding · Transformers · Other*

An Italian sentence embedding model based on IT5, mapping text to a 512-dimensional vector space, suitable for semantic search and clustering tasks.
## Sentence It5 Small
*efederici · 83 downloads · 0 likes · Text Embedding · Transformers · Other*

A small Italian sentence embedding model based on the T5 architecture, suitable for semantic search and text similarity computation.
## Persian T5 Paraphraser
*erfan226 · 290 downloads · 2 likes · Text Generation · Transformers · Other*

This is a text paraphrasing model for Persian language, built upon a Persian monolingual T5 model, capable of generating multiple paraphrased versions of Persian texts.
## Plt5 Large
*allegro · 1,366 downloads · 5 likes · Large Language Model · Transformers · Other*

plT5 is a language model based on the T5 architecture, trained on a Polish corpus with the original T5 denoising objective.
## Gemini
*describeai · MIT · 844 downloads · 43 likes · Large Language Model · Transformers · English*

A T5-based code summarization model that generates explanations for code in multiple programming languages.
## Arat5 Base Title Generation
*UBC-NLP · 117 downloads · 12 likes · Large Language Model · Transformers · Arabic*

AraT5 is a series of text generation models specifically designed for Arabic, including versions for Modern Standard Arabic, Twitter dialect, and a general version.
## Code Trans T5 Small Code Documentation Generation Go Multitask
*SEBIS · 17 downloads · 0 likes · Text Generation*

A Go code documentation generation model based on the T5-small architecture, with multi-task training support.
## Ke T5 Large
*KETI-AIR · Apache-2.0 · 147 downloads · 8 likes · Large Language Model · Supports Multiple Languages*

A T5 model pre-trained on Korean and English, suitable for cross-lingual knowledge-driven response generation tasks.
## Code Trans T5 Large Source Code Summarization Python Multitask Finetune
*SEBIS · 78 downloads · 13 likes · Text Generation*

A pretrained model based on the T5-large architecture, specifically designed for Python code summarization tasks with multi-task learning support.
## Code Trans T5 Base Code Documentation Generation Java Multitask
*SEBIS · 57 downloads · 1 like · Text Generation*

A pre-trained model based on the T5 architecture, specifically designed for generating Java function documentation with multi-task processing support.
## Code Trans T5 Base Code Documentation Generation Python Multitask Finetune
*SEBIS · 26 downloads · 1 like · Text Generation*

A Python code documentation generation model based on the T5 architecture, pre-trained and fine-tuned with multi-task learning, specifically designed for generating Python function documentation.
## Code Trans T5 Base Code Documentation Generation Go
*SEBIS · 18 downloads · 0 likes · Large Language Model*

A T5-based Go code documentation generation model, specifically designed for generating descriptive documentation for Go functions.
## Code Trans T5 Base Code Documentation Generation Go Multitask Finetune
*SEBIS · 15 downloads · 0 likes · Text Generation*

A T5 architecture-based Go language code documentation generation model, pre-trained and fine-tuned with multi-task learning, specifically designed for generating documentation for Go functions/methods.
## T5 Base En Generate Headline
*Michau · 26.72k downloads · 53 likes · Text Generation*

A T5 model trained on 500,000 articles and their headlines, used to generate a concise and accurate single-line headline for a given article.
## Code Trans T5 Base Commit Generation
*SEBIS · 15 downloads · 1 like · Text Generation*

A Git commit message generation model based on the T5-base architecture, optimized for tokenized Git commits.
## Code Trans T5 Base Program Synthese
*SEBIS · 16 downloads · 0 likes · Large Language Model*

A Lisp-style DSL code generation model based on the T5 architecture, converting natural language descriptions into program code.
## Code Trans T5 Base Code Comment Generation Java Multitask Finetune
*SEBIS · 16 downloads · 0 likes · Large Language Model*

A Java code comment generation model based on the T5 architecture, optimized through multi-task pre-training and fine-tuning, specifically designed to generate descriptive text for Java functions.
## Rut5 Base Sum Gazeta
*IlyaGusev · Apache-2.0 · 3,640 downloads · 13 likes · Text Generation · Transformers · Other*

A Russian abstractive summarization model based on rut5-base, optimized for Russian news summarization tasks.